Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from this site.
- Intraspecific variation in morphology and behavior is widespread, especially in species with large distribution ranges. This includes foraging, which can vary according to the local resource landscape. How this may be linked to differences in social structure, especially in socially foraging species, is less well known. Greater spear-nosed bats are well known for their large repertoire of often highly complex social behaviors. In Trinidad, they form stable groups of unrelated females that recruit other members to temporally unpredictable flowering balsa trees. We compared these findings with a dataset of capture data, GPS tracks, and observations collected over six years in a colony in Panamá. We found profound differences in the foraging behavior and group stability of Phyllostomus hastatus during the dry season, when social behaviors were expected. Female bats did not coordinate commutes to exploit distinct foraging resources as a group. Instead, females commuted individually to very distant foraging areas which overlapped between groups. Linked to this, we found groups to be unstable in size over the short and long term. Our findings highlight the large intraspecific variation and indicate a strong influence of the local resource landscape, and the associated benefits of social foraging, on the social structure of these bats and possibly many other animals. Free, publicly accessible full text available March 20, 2026.
- Using unmanned aerial vehicles (UAVs) to track multiple individuals simultaneously in their natural environment is a powerful approach for better understanding the collective behavior of primates. Previous studies have demonstrated the feasibility of automating primate behavior classification from video data, but these studies were carried out in captivity or with ground-based cameras. However, to understand group behavior and the self-organization of a collective, the whole troop needs to be observed at a scale where behavior can be seen in relation to the natural environment in which ecological decisions are made. To tackle this challenge, this study presents a novel dataset for baboon detection, tracking, and behavior recognition from drone videos in which troops are observed on the move in their natural environment as they travel to and from their sleeping sites. Videos were captured from drones at Mpala Research Centre, a research station located in Laikipia County in central Kenya. The baboon detection dataset was created by manually annotating all baboons in drone videos with bounding boxes. A tiling method was subsequently applied to create a pyramid of images at various scales from the original 5.3K-resolution images, resulting in approximately 30K images used for baboon detection. The baboon tracking dataset is derived from the detection dataset, with bounding boxes consistently assigned the same ID throughout each video. This process resulted in half an hour of dense tracking data. The baboon behavior recognition dataset was generated by converting tracks into mini-scenes, video subregions centered on each animal. These mini-scenes were annotated with 12 distinct behavior types and one additional category for occlusion, resulting in over 20 hours of data. Benchmark results show a mean average precision (mAP) of 92.62% for the YOLOv8-X detection model, a multiple object tracking precision (MOTP) of 87.22% for the DeepSORT tracking algorithm, and a micro top-1 accuracy of 64.89% for the X3D behavior recognition model. Using deep learning to rapidly and accurately classify wildlife behavior from drone footage facilitates non-invasive data collection on behavior, enabling the behavior of a whole group to be systematically and accurately recorded. The dataset can be accessed at https://baboonland.xyz. Free, publicly accessible full text available June 16, 2026.
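The benchmark pipeline above (tiled 5.3K frames, YOLOv8-X detection, DeepSORT tracking, X3D behavior recognition) follows a common pattern. The sketch below illustrates only the tiling step in a generic way; the tile size, overlap and output layout are assumptions for illustration, not the dataset's documented parameters, and the commented detection call uses the off-the-shelf ultralytics package rather than the authors' trained weights.

```python
# Hedged sketch: tiling a high-resolution drone frame into overlapping crops
# so that small animals remain detectable by a fixed-input detector.
# Tile size, overlap, and file naming are illustrative assumptions only.
from pathlib import Path
from PIL import Image

def tile_frame(frame_path: str, out_dir: str, tile: int = 1280, overlap: int = 256) -> None:
    """Cut one frame into overlapping square tiles and save them to out_dir."""
    img = Image.open(frame_path)
    w, h = img.size
    step = tile - overlap
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for top in range(0, max(h - overlap, 1), step):
        for left in range(0, max(w - overlap, 1), step):
            box = (left, top, min(left + tile, w), min(top + tile, h))
            img.crop(box).save(out / f"{Path(frame_path).stem}_{top}_{left}.jpg")

# Detection on the tiles could then use an off-the-shelf YOLOv8 model, e.g.:
#   from ultralytics import YOLO
#   model = YOLO("yolov8x.pt")              # generic weights, not the paper's
#   results = model.predict("tiles/", imgsz=1280)
```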
- All foraging animals face a trade-off: how much time should they invest in exploitation of known resources versus exploration to discover new resources? For group-living central place foragers, this balance is challenging. Due to the nature of their movement patterns, exploration and exploitation are often mutually exclusive, while the availability of social information may discourage individuals from exploring. To examine these trade-offs, we GPS-tracked groups of greater spear-nosed bats (Phyllostomus hastatus) from three colonies on Isla Colón, Panamá. During the dry season, when these omnivores forage on the nectar of unpredictable balsa flowers, bats consistently travelled long distances to remote, colony-specific foraging areas, bypassing flowering trees closer to their roosts. They continued using these areas in the wet season, when feeding on a diverse, presumably ubiquitous diet, but also visited other, similarly distant foraging areas. Foraging areas were shared within but not always between colonies. Our longitudinal dataset suggests that bats from each colony invest in long-distance commutes to socially learned, shared foraging areas, bypassing other available food patches. Rather than exploring nearby resources, these bats exploit colony-specific foraging locations that appear to be culturally transmitted. These results give insight into how social animals might diverge from optimal foraging. Free, publicly accessible full text available December 1, 2025.
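The within- versus between-colony sharing of foraging areas reported above is a spatial-overlap question. As a rough illustration only, the sketch below scores overlap between two colonies' foraging areas as the intersection of convex hulls around projected GPS fixes; the column names and the hull-based area estimate are assumptions and do not reproduce the study's actual analysis.

```python
# Hedged sketch: a generic way to quantify overlap between foraging areas
# estimated from GPS fixes, using one convex hull per colony.
# Column names ('colony', 'x', 'y') are assumptions, not the study's format.
import pandas as pd
from shapely.geometry import MultiPoint

def foraging_overlap(fixes: pd.DataFrame) -> float:
    """fixes: columns ['colony', 'x', 'y'] in a projected CRS (metres).
    Returns the proportion of the smaller hull shared with the other colony."""
    hulls = {
        colony: MultiPoint(list(zip(grp["x"], grp["y"]))).convex_hull
        for colony, grp in fixes.groupby("colony")
    }
    a, b = list(hulls.values())[:2]
    smaller = min(a.area, b.area)
    return a.intersection(b).area / smaller if smaller > 0 else 0.0
```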
- Drones are increasingly popular for collecting behaviour data on group‐living animals, offering inexpensive and minimally disruptive observation methods. Imagery collected by drones can be rapidly analysed using computer vision techniques to extract information, including behaviour classification, habitat analysis and identification of individual animals. While computer vision techniques can rapidly analyse drone‐collected data, the success of these analyses often depends on careful mission planning that considers downstream computational requirements, a critical factor frequently overlooked in current studies. We present a comprehensive summary of research in the growing AI‐driven animal ecology (ADAE) field, which integrates data collection with automated computational analysis focused on aerial imagery for collective animal behaviour studies. We systematically analyse current methodologies, technical challenges and emerging solutions in this field, from drone mission planning to behavioural inference. We illustrate computer vision pipelines that infer behaviour from drone imagery and present the computer vision tasks used for each step. We map specific computational tasks to their ecological applications, providing a framework for future research design. Our analysis reveals that AI‐driven animal ecology studies of collective animal behaviour using drone imagery focus on detection and classification computer vision tasks. While convolutional neural networks (CNNs) remain dominant for detection and classification tasks, newer architectures like transformer‐based models and specialized video analysis networks (e.g. X3D, I3D, SlowFast) designed for temporal pattern recognition are gaining traction for pose estimation and behaviour inference. However, reported model accuracy varies widely by computer vision task, species, habitat and evaluation metric, complicating meaningful comparisons between studies. Based on current trends, we conclude that semi‐autonomous drone missions will be increasingly used to study collective animal behaviour. While manual drone operation remains prevalent, autonomous drone manoeuvres, powered by edge AI, can scale and standardise collective animal behaviour studies while reducing the risk of disturbance and improving data quality. We propose guidelines for AI‐driven animal ecology drone studies adaptable to various computer vision tasks, species and habitats. This approach aims to collect high‐quality behaviour data while minimising disruption to the ecosystem.
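The review describes a recurring pipeline: detection, then tracking, then per-individual video clips, then behaviour inference. The skeleton below sketches that flow in generic Python; every class and callable is a placeholder standing in for the many concrete models surveyed (CNN detectors, DeepSORT-style trackers, X3D/I3D/SlowFast classifiers), not an implementation from the paper.

```python
# Hedged sketch of the generic drone-video analysis pipeline:
# detection -> tracking -> per-individual clips ("mini-scenes") -> behaviour labels.
# detector, tracker and behaviour_model are placeholder callables.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Detection:
    frame: int          # index of the video frame
    box: tuple          # (x1, y1, x2, y2) in pixels
    track_id: int = -1  # filled in by the tracker

def run_pipeline(
    video_frames: list,
    detector: Callable,         # frame image -> List[Detection]
    tracker: Callable,          # List[Detection] -> Dict[int, List[Detection]]
    behaviour_model: Callable,  # list of cropped frames -> behaviour label
) -> List[dict]:
    """Apply the three stages in sequence and return per-track behaviour labels."""
    detections = [d for i, f in enumerate(video_frames) for d in detector(f, frame=i)]
    tracks = tracker(detections)  # groups detections by individual
    labels = []
    for track_id, track_dets in tracks.items():
        clip = [video_frames[d.frame] for d in track_dets]  # mini-scene around one animal
        labels.append({"track": track_id, "behaviour": behaviour_model(clip)})
    return labels
```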
- Understanding the amount of space required by animals to fulfill their biological needs is essential for comprehending their behavior and their ecological role within their community, and for effective conservation planning and resource management. The space-use patterns of habituated primates are often studied using handheld GPS devices, which provide detailed movement information that can link patterns of ranging and space-use to the behavioral decisions that generate these patterns. However, these data may not accurately represent an animal’s total movements, posing challenges when the desired inference is at the home range scale. To address this problem, we used a 13-year dataset from 11 groups of white-faced capuchins (Cebus capucinus imitator) to examine the impact of sampling elements, such as sample size, regularity, and temporal coverage, on home range estimation accuracy. We found that accurate home range estimation is feasible with relatively small absolute sample sizes and irregular sampling, as long as the data are collected over extended time periods. In contrast, concentrated sampling can lead to bias and overconfidence due to uncaptured variation in space use and underlying movement behaviors. Sampling protocols relying on handheld GPS for home range estimation are improved by maximizing independent location data distributed across time periods much longer than the target species’ home range crossing timescale.
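To make the sampling argument concrete, the sketch below contrasts a home-range estimate from a temporally concentrated sample with one from the same number of fixes spread across the full record. A 100% minimum convex polygon is used purely because it is simple; the study's actual estimator and data format are not reproduced here.

```python
# Hedged sketch: how sampling regime can change a simple home-range estimate.
# A minimum convex polygon (MCP) stands in for the study's estimator.
import numpy as np
from shapely.geometry import MultiPoint

def mcp_area(xy: np.ndarray) -> float:
    """Area (m^2) of the 100% minimum convex polygon around projected fixes (N x 2)."""
    return MultiPoint([tuple(p) for p in xy]).convex_hull.area

def compare_sampling(xy: np.ndarray, n: int) -> dict:
    """Contrast n temporally concentrated fixes (the first n in the record)
    with n fixes spread evenly across the whole tracking period."""
    concentrated = xy[:n]
    spread = xy[np.linspace(0, len(xy) - 1, n).astype(int)]
    return {"concentrated_m2": mcp_area(concentrated), "spread_m2": mcp_area(spread)}
```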
- Over the past five decades, a large number of wild animals have been individually identified by various observation systems and/or temporary tracking methods, providing unparalleled insights into their lives over both time and space. However, so far there is no comprehensive record of uniquely, individually identified animals, nor of where their data and metadata (for example, photos, physiological and genetic samples, disease screens, and information on social relationships) are stored. Databases currently do not offer unique identifiers for living, individual wild animals comparable to the permanent ID labelling of deceased museum specimens. To address this problem, we introduce two new concepts: (1) a globally unique animal ID (UAID), available to define uniquely and individually identified animals archived in any database, including metadata archived at the time of publication; and (2) the digital ‘home’ for UAIDs, the Movebank Life History Museum (MoMu), which stores and links metadata, media, communications and other files associated with animals individually identified in the wild. MoMu will ensure that metadata are available for future generations, allowing permanent linkages to information in other databases. MoMu allows researchers to collect and store photos, behavioural records, genome data and/or resightings of UAIDed animals, encompassing information not easily included in the structured datasets supported by existing databases. Metadata are uploaded through the Animal Tracker app, the MoMu website, by email from registered users or through an Application Programming Interface (API) from any database. Initially, records can be stored in a temporary folder, much as naturalists routinely do with a field drawer. Later, researchers and specialists can curate these materials for individual animals, manage the secure sharing of sensitive information and, where appropriate, publish individual life histories with DOIs. The storage of such synthesized lifetime stories of wild animals under a UAID (unique identifier or ‘animal passport’) will support basic science, conservation efforts and public participation.
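As a purely illustrative sketch of the API-based upload route mentioned above, the snippet below posts one metadata record for a UAID to a REST endpoint. The endpoint path, payload fields and authentication scheme are hypothetical placeholders, not the documented MoMu or Movebank API.

```python
# Hedged sketch: attaching a metadata record to an individually identified animal
# via a REST API, in the spirit of the UAID/MoMu workflow described above.
# The route, payload and auth scheme are hypothetical; consult the real API docs.
import requests

def upload_uaid_record(api_base: str, token: str, uaid: str, record: dict) -> int:
    """POST one metadata record (e.g. a resighting or photo reference) for a UAID."""
    resp = requests.post(
        f"{api_base}/animals/{uaid}/records",          # hypothetical route
        json=record,
        headers={"Authorization": f"Bearer {token}"},  # hypothetical auth scheme
        timeout=30,
    )
    resp.raise_for_status()
    return resp.status_code

# Example with hypothetical values:
# upload_uaid_record("https://example.org/momu/api", "TOKEN",
#                    "UAID-0001", {"type": "resighting", "date": "2024-05-01"})
```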
- Inexpensive and accessible sensors are accelerating data acquisition in animal ecology. These technologies hold great potential for large-scale ecological understanding, but are limited by current processing approaches which inefficiently distill data into relevant information. We argue that animal ecologists can capitalize on large datasets generated by modern sensors by combining machine learning approaches with domain knowledge. Incorporating machine learning into ecological workflows could improve inputs for ecological models and lead to integrated hybrid modeling tools. This approach will require close interdisciplinary collaboration to ensure the quality of novel approaches and train a new generation of data scientists in ecology and conservation.
- Conservation funding is currently limited; cost‐effective conservation solutions are essential. We suggest that the thousands of field stations worldwide can play key roles at the frontline of biodiversity conservation and have high intrinsic value. We assessed field stations’ conservation return on investment and explored the impact of COVID‐19. We surveyed leaders of field stations across tropical regions that host primate research; 157 field stations in 56 countries responded. Respondents reported improved habitat quality and reduced hunting rates at over 80% of field stations, and lower operational costs per km² than protected areas, yet half of those surveyed have less funding now than in 2019. Spatial analyses indicate that the presence of a field station reduces deforestation. These “earth observatories” provide a high return on investment; we advocate for increased support of field station programs and for governments to support their vital conservation efforts by investing accordingly.